Liu M Y, Tuzel O. Coupled generative adversarial networks[C]//Advances in neural information processing systems. 2016: 469-477.
1. Overview
This paper proposes Coupled GAN (CoGAN)
- based on the assumption that the two domains share a high-level representation
- learns a joint distribution of multi-domain images from samples drawn only from the marginal distributions (no paired images)
- unsupervised: no tuples of corresponding images are required
- applied to image transformation, domain adaptation, etc.
 
1.1. Model

Generator
- the two generators share the weights of their first layers, so both images are decoded from the same high-level concept
Discriminator
- the two discriminators share the weights of their last layers; the sharing constraint also reduces the number of parameters (see the sketch below)
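
A minimal sketch of the weight-sharing idea; the layer sizes and the 28×28 flattened outputs are assumptions for illustration, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class CoupledGenerators(nn.Module):
    """Two generators sharing the first (high-level) layers."""
    def __init__(self, z_dim=100):
        super().__init__()
        # shared layers decode the common high-level concept
        self.shared = nn.Sequential(
            nn.Linear(z_dim, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
        )
        # domain-specific heads render the low-level details
        self.head1 = nn.Sequential(nn.Linear(512, 28 * 28), nn.Tanh())
        self.head2 = nn.Sequential(nn.Linear(512, 28 * 28), nn.Tanh())

    def forward(self, z):
        h = self.shared(z)
        return self.head1(h), self.head2(h)  # pair of corresponding images

class CoupledDiscriminators(nn.Module):
    """Two discriminators sharing the last layers."""
    def __init__(self):
        super().__init__()
        # domain-specific layers extract low-level features
        self.feat1 = nn.Sequential(nn.Linear(28 * 28, 512), nn.ReLU())
        self.feat2 = nn.Sequential(nn.Linear(28 * 28, 512), nn.ReLU())
        # shared layers judge the common high-level representation
        self.shared = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 1))

    def forward(self, x1, x2):
        return self.shared(self.feat1(x1)), self.shared(self.feat2(x2))
```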
 
1.2. Loss Function
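
Roughly, the objective is the sum of two standard GAN minimax games, one per domain; the coupling comes from the weight-sharing constraints, which are enforced by the network structure rather than by extra loss terms. With generators $g_1, g_2$ and discriminators $f_1, f_2$:

$$
\max_{g_1,g_2}\;\min_{f_1,f_2}\;
V(f_1,f_2,g_1,g_2)=
\mathbb{E}_{x_1\sim p_{X_1}}\!\left[-\log f_1(x_1)\right]
+\mathbb{E}_{z\sim p_Z}\!\left[-\log\bigl(1-f_1(g_1(z))\bigr)\right]
+\mathbb{E}_{x_2\sim p_{X_2}}\!\left[-\log f_2(x_2)\right]
+\mathbb{E}_{z\sim p_Z}\!\left[-\log\bigl(1-f_2(g_2(z))\bigr)\right]
$$

subject to $g_1, g_2$ sharing their first-layer weights and $f_1, f_2$ sharing their last-layer weights.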


1.3. Related Work
- VAE
- Attention Model
- Moment Matching
- Diffusion Process
- Cross-domain Image Generation
- GAN
  - Laplacian Pyramid
  - Conditional GAN
 
 
2. Experiments
2.1. Metric
- pixel agreement ratio: the fraction of pixels that agree between the two generated images of a pair, comparing the second image against the known transformation of the first (a sketch of the computation follows)
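
A rough sketch of how such a ratio could be computed; the binarization threshold and the exact comparison are assumptions, not spelled out in these notes:

```python
import numpy as np

def pixel_agreement_ratio(img_a, img_b, threshold=0.5):
    """Fraction of pixels on which two binarized images agree.

    img_a, img_b: float arrays in [0, 1] of the same shape, e.g. the
    generated edge image vs. edges extracted from the generated digit.
    """
    a = img_a > threshold
    b = img_b > threshold
    return np.mean(a == b)
```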
 
2.2. Digit

- two tasks: [digit, edge image] and [digit, negative image]
- without the weight-sharing constraint, the two GANs generate unrelated image pairs
 

- performance is correlated with the amount of weight sharing in the generators
- it is uncorrelated with weight sharing in the discriminators
 
2.3. Face
2.4. Color and Depth Image
3. Applications
3.1. Unsupervised Domain Adaptation (UDA)

- task: MNIST (labeled) → USPS (unlabeled)
- attach a softmax layer c to the last hidden layer of the discriminator, train the classifier on labeled MNIST, then predict on USPS through the shared discriminator layers (see the sketch below)
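
A rough sketch of the idea, reusing the hypothetical CoupledDiscriminators from Section 1.1 (layer sizes, attribute names, and the slicing of the shared stack are all assumptions):

```python
import torch.nn as nn

class DomainAdaptedClassifier(nn.Module):
    """Softmax classifier c attached to the discriminators' shared hidden layer.

    Trained with labels on domain 1 (e.g. MNIST); because the last
    discriminator layers are shared, c composed with the domain-2 branch
    can classify unlabeled domain-2 images (e.g. USPS).
    """
    def __init__(self, coupled_disc, n_classes=10, hidden_dim=512):
        super().__init__()
        self.disc = coupled_disc                      # expects feat1, feat2, shared
        self.softmax_head = nn.Linear(hidden_dim, n_classes)  # layer c (logits)

    def shared_hidden(self, feat):
        # all shared layers except the final real/fake output
        return self.disc.shared[:-1](feat)

    def classify_domain1(self, x1):   # used during supervised training on MNIST
        return self.softmax_head(self.shared_hidden(self.disc.feat1(x1)))

    def classify_domain2(self, x2):   # used at test time on the unlabeled domain
        return self.softmax_head(self.shared_hidden(self.disc.feat2(x2)))
```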
 

3.2. Cross-Domain Image Transformation
Given x1 in domain 1, find the corresponding image x2 in domain 2. With CoGAN this is done in two steps (see the sketch after the list):

- find the latent code z* that best reproduces x1, i.e. minimize a reconstruction loss between g1(z) and x1 over z
- use z* in the second generator to obtain x2 = g2(z*)
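
A minimal sketch of this search, assuming the hypothetical CoupledGenerators above and an L2 reconstruction loss; the optimizer, step count, and learning rate are assumptions:

```python
import torch

def cross_domain_transform(gen, x1, z_dim=100, steps=500, lr=0.01):
    """Find z* such that g1(z*) ≈ x1, then return g2(z*) as the domain-2 image."""
    z = torch.randn(1, z_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        img1, _ = gen(z)                      # g1(z)
        loss = torch.mean((img1 - x1) ** 2)   # reconstruction loss in domain 1
        loss.backward()
        opt.step()
    with torch.no_grad():
        _, img2 = gen(z)                      # x2 = g2(z*)
    return img2
```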